On kernel smoothing for extremal quantile regression
Abstract
Nonparametric regression quantiles obtained by inverting a kernel estimator of the conditional distribution of the response are long established in statistics. Attention has, however, been restricted to ordinary quantiles staying away from the tails of the conditional distribution. The purpose of this paper is to extend their asymptotic theory far enough into the tails. We focus on extremal quantile regression estimators of a response variable given a vector of covariates in the general setting, whether the conditional extreme-value index is positive, negative, or zero. Specifically, we elucidate their limit distributions when they are located in the range of the data or near and even beyond the sample boundary, under technical conditions that link the speed of convergence of their (intermediate or extreme) order with the oscillations of the quantile function and a von Mises property of the conditional distribution. A simulation experiment and an illustration on real data are presented. The real data are the American electric data, where the estimation of conditional extremes is found to be of genuine interest.
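The basic construction the abstract refers to, inverting a kernel estimate of the conditional distribution function, can be sketched as follows. This is a minimal illustration of the generic idea, not the authors' estimator: it uses a Gaussian kernel, a single covariate, and a fixed bandwidth `h`, all of which are assumptions for the example.

```python
import numpy as np

def kernel_conditional_quantile(x0, X, Y, tau, h):
    """Conditional quantile of Y given X = x0 at level tau, obtained by
    inverting a Nadaraya-Watson kernel estimate of the conditional CDF."""
    # Gaussian kernel weights on the covariate distances
    w = np.exp(-0.5 * ((X - x0) / h) ** 2)
    w = w / w.sum()
    # Weighted (smoothed) conditional CDF over the sorted responses
    order = np.argsort(Y)
    ys, cdf = Y[order], np.cumsum(w[order])
    # Generalized inverse: smallest y with F_hat(y | x0) >= tau
    idx = np.searchsorted(cdf, tau)
    return ys[min(idx, len(ys) - 1)]
```

For central levels (tau = 0.5, say) this behaves well; the paper's contribution concerns what happens when tau is pushed toward 0 or 1 relative to the local sample size, where the estimator sits near or beyond the conditional sample boundary.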
Similar resources
A Closer Examination of Extreme Value Theory Modeling in Value-at-Risk Estimation
Extreme value theory has been widely used for modeling the tails of return distributions. The generalized Pareto distribution (GPD) is widely acknowledged as one of the major tools in Value-at-Risk (VaR) estimation. As Basel II tightens the significance level for VaR estimation from the previous 5% quantile level to the more extreme 1% quantile level, it demands a more accurate estimation approach...
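The GPD-based VaR approach mentioned in this snippet is typically implemented via peaks-over-threshold: fit a GPD to the exceedances over a high threshold and plug the fitted shape and scale into the standard tail-quantile formula. A minimal sketch (not the paper's method; the threshold choice and `scipy` MLE fit are assumptions for the example):

```python
import numpy as np
from scipy.stats import genpareto

def pot_var(losses, u, p):
    """Value-at-Risk at level p via peaks-over-threshold: fit a GPD to the
    exceedances over threshold u and extrapolate the tail quantile."""
    exceed = losses[losses > u] - u
    # MLE for shape xi and scale sigma, with the GPD location fixed at 0
    xi, _, sigma = genpareto.fit(exceed, floc=0)
    n, n_u = len(losses), len(exceed)
    # Standard POT quantile formula (valid for xi != 0)
    return u + (sigma / xi) * (((n / n_u) * (1 - p)) ** (-xi) - 1)
```

A common threshold choice is an intermediate empirical quantile of the losses (e.g. the 95% level), which is exactly the kind of tuning the cited line of work scrutinizes.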
On Quantile Regression in Reproducing Kernel Hilbert Spaces with the Data Sparsity Constraint
For spline regressions, it is well known that the choice of knots is crucial for the performance of the estimator. As a general learning framework covering smoothing splines, learning in a Reproducing Kernel Hilbert Space (RKHS) has a similar issue. However, the selection of training data points for kernel functions in the RKHS representation has not been carefully studied in the literature...
Nonparametric Test for Checking Lack-of-Fit of Quantile Regression Model under Random Censoring
Recently, considerable attention has been devoted to quantile regression under random censoring in both the statistical and econometric literature, yet little has been done on the important problem of model checking. This paper proposes a nonparametric test for checking the lack-of-fit of the quantile function of the survival time given the covariates when the survival time is subjected to random ...
Uniform Bahadur Representation for Nonparametric Censored Quantile Regression: A Redistribution-of-Mass Approach
Censored quantile regressions have received a great deal of attention in the literature. In a linear setup, recent research has found that an estimator based on the idea of “redistribution-of-mass” (Efron, 1967) has better numerical performance than other available methods. In this paper, this idea is combined with the local polynomial kernel smoothing for nonparametric quantile regression of c...
Estimating Densities, Quantiles, Quantile Densities and Density Quantiles
Abstract. To estimate the quantile density function (the derivative of the quantile function) by kernel means, there are two alternative approaches. One is the derivative of the kernel quantile estimator; the other is essentially the reciprocal of the kernel density estimator. We give ways in which the former method has certain advantages over the latter. Various closely related smoothing i...
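The two alternative approaches this snippet contrasts can be sketched side by side. This is an illustrative implementation under my own simplifying assumptions (Gaussian kernels, a finite-difference derivative of the smoothed quantile curve, fixed bandwidths), not the estimators analyzed in the cited paper:

```python
import numpy as np

def kernel_quantile(sample, u, h):
    """Kernel-smoothed quantile function: a Gaussian-weighted average of
    the order statistics, with weights centred at quantile level u."""
    ys = np.sort(sample)
    levels = (np.arange(1, len(ys) + 1) - 0.5) / len(ys)
    w = np.exp(-0.5 * ((levels - u) / h) ** 2)
    return np.sum(w * ys) / np.sum(w)

def qdens_derivative(sample, u, h, eps=1e-3):
    """Approach 1: quantile density q(u) = Q'(u) as the derivative of the
    kernel quantile estimator (finite difference of the smoothed curve)."""
    return (kernel_quantile(sample, u + eps, h)
            - kernel_quantile(sample, u - eps, h)) / (2 * eps)

def qdens_reciprocal(sample, u, h):
    """Approach 2: q(u) = 1 / f(Q(u)), the reciprocal of a kernel density
    estimate evaluated at the empirical u-quantile."""
    q_u = np.quantile(sample, u)
    f_hat = np.mean(np.exp(-0.5 * ((sample - q_u) / h) ** 2)) / (h * np.sqrt(2 * np.pi))
    return 1.0 / f_hat
```

For a standard normal sample, both should approximate q(0.5) = 1/phi(0), which is the sqrt of 2*pi, roughly 2.51; the bandwidth for approach 1 lives on the quantile-level scale, while the bandwidth for approach 2 lives on the data scale, which is one practical difference between the two routes.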